A Recurrent Neural Network that Learns to Count
Authors
Abstract
Parallel distributed processing (PDP) architectures demonstrate a potentially radical alternative to the traditional theories of language processing that are based on serial computational models. However, learning complex structural relationships in temporal data presents a serious challenge to PDP systems. For example, automata theory dictates that processing strings from a context-free langua...
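The counting behaviour at issue can be illustrated with a minimal hand-set sketch (an assumption for illustration, not the paper's trained model): a single recurrent "counter" unit of the kind such networks are often reported to discover, which increments its state on `a`, decrements on `b`, and accepts a string from the context-free language a^n b^n (n ≥ 1) exactly when the counter returns to zero with no underflow.

```python
def accepts(string):
    """Hypothetical hand-wired counter unit: accept strings of the
    form a^n b^n (n >= 1) by tracking a single running count."""
    count = 0
    seen_b = False
    for ch in string:
        if ch == 'a':
            if seen_b:           # an 'a' after any 'b' breaks a^n b^n
                return False
            count += 1
        elif ch == 'b':
            seen_b = True
            count -= 1
            if count < 0:        # more b's than a's so far
                return False
        else:
            return False         # alphabet is {a, b}
    return count == 0 and seen_b # balanced, non-empty

print(accepts("aaabbb"))  # → True
print(accepts("aabbb"))   # → False
```

A trained recurrent network approximates this discrete counter with a continuous hidden state, which is why generalization to longer strings than those seen in training is the interesting empirical question.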
Similar resources

A recurrent network that learns to pronounce English text
Previous attempts to derive connectionist models for text-to-phoneme conversion – such as NETtalk and NETspeak – have generally used pre-aligned training data and purely feedforward networks, both of which represent simplifications of the problem. In this work, we explore the potential of recurrent networks to perform the conversion task when trained on non-aligned data. Initially, our use of a ...
A 'Neural' Network that Learns to Play Backgammon
We describe a class of connectionist networks that have learned to play backgammon at an intermediate-to-advanced level. The networks were trained by a supervised learning procedure on a large set of sample positions evaluated by a human expert. In actual match play against humans and conventional computer programs, the networks demonstrate substantial ability to generalize on the basis of exp...
Meter as Mechanism: A Neural Network that Learns
One kind of prosodic structure that apparently underlies both music and some examples of speech production is meter. Yet detailed measurements of the timing of both music and speech show that the nested periodicities that define metrical structure can be quite noisy in time. What kind of system could produce or perceive such variable metrical timing patterns? And what would it take to be able to...
GS: A Network that Learns Important Features
GS is a network for supervised inductive learning from examples that uses ideas from neural networks and symbolic inductive learning to gain benefits of both methods. The network is built of many simple nodes that learn important features in the input space and then monitor the ability of the features to predict output values. The network avoids the exponential nature of the number of features ...
Journal
Journal title: Connection Science
Year: 1999
ISSN: 0954-0091, 1360-0494
DOI: 10.1080/095400999116340